Weighted kappa is higher than Cohen’s kappa for tridiagonal agreement tables

Authors

Abstract


Similar articles

Cohen’s quadratically weighted kappa is higher than linearly weighted kappa for tridiagonal agreement tables

Cohen’s weighted kappa is a popular descriptive statistic for measuring the agreement between two raters on an ordinal scale. Popular weights for weighted kappa are the linear weights and the quadratic weights. It has been frequently observed in the literature that the value of the quadratically weighted kappa is higher than the value of the linearly weighted kappa. In this paper this phenomeno...

Full text
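
That ordering is easy to reproduce numerically. The sketch below (Python with NumPy; the counts are made up, not taken from the paper) computes weighted kappa with linear and quadratic disagreement weights for a tridiagonal table, i.e. one in which all entries more than one step off the main diagonal are zero.

```python
import numpy as np

def weighted_kappa(table, power):
    """Weighted kappa with disagreement weights |i - j| ** power
    (power=1: linear weights, power=2: quadratic weights)."""
    p = table / table.sum()                      # joint proportions
    n = p.shape[0]
    i, j = np.indices((n, n))
    v = np.abs(i - j) ** power                   # disagreement weights
    e = np.outer(p.sum(axis=1), p.sum(axis=0))   # chance-expected proportions
    return 1.0 - (v * p).sum() / (v * e).sum()

# Hypothetical 4x4 tridiagonal table: the raters never disagree by
# more than one category.
T = np.array([
    [20,  5,  0,  0],
    [ 4, 15,  6,  0],
    [ 0,  3, 18,  5],
    [ 0,  0,  4, 20],
], dtype=float)

lin = weighted_kappa(T, power=1)    # ~0.78
quad = weighted_kappa(T, power=2)   # ~0.89
print(f"linear:    {lin:.4f}")
print(f"quadratic: {quad:.4f}")
assert quad > lin  # the ordering the paper establishes for such tables
```

On a tridiagonal table every observed disagreement has |i − j| = 1, so both schemes share the same weighted observed disagreement; squaring only enlarges the chance-expected disagreement in the denominator, which is what lifts the quadratic value here.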

The Analysis of Ordinal Agreement beyond Weighted Kappa

The weighted kappa statistic has been used as an agreement index for ordinal data. Using data on the comparability of primary and proxy respondent reports of alcohol drinking frequency, we show that the value of weighted kappa can be sensitive to the choice of weights. The distinction between association and agreement is clarified, and it is shown that in some respects weighted kappa behaves more...

Full text
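
As a concrete illustration of that sensitivity, the sketch below (again a made-up table, not the article's drinking-frequency data) evaluates one agreement table under identity, linear, and quadratic disagreement weights; the three values differ noticeably.

```python
import numpy as np

def weighted_kappa(table, v):
    """Weighted kappa, 1 - observed/expected weighted disagreement,
    where v[i, j] is the disagreement weight for cell (i, j)."""
    p = table / table.sum()
    e = np.outer(p.sum(axis=1), p.sum(axis=0))
    return 1.0 - (v * p).sum() / (v * e).sum()

# Hypothetical 4x4 agreement table.
T = np.array([
    [14,  6,  2,  0],
    [ 5, 10,  7,  2],
    [ 2,  6,  9,  6],
    [ 1,  2,  5, 13],
], dtype=float)

n = T.shape[0]
i, j = np.indices((n, n))
schemes = {
    "unweighted (identity)": (np.abs(i - j) > 0).astype(float),
    "linear":                np.abs(i - j).astype(float),
    "quadratic":             ((i - j) ** 2).astype(float),
}
for name, v in schemes.items():
    # prints roughly 0.35, 0.51 and 0.65 for this table
    print(f"{name:22s} kappa = {weighted_kappa(T, v):.3f}")
```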

Cohen's linearly weighted kappa is a weighted average

An agreement table with n ∈ ℕ≥3 ordered categories can be collapsed into n−1 distinct 2×2 tables by combining adjacent categories. Vanbelle and Albert (Stat. Methodol. 6:157–163, 2009c) showed that the components of Cohen’s weighted kappa with linear weights can be obtained from these n−1 collapsed 2×2 tables. In this paper we consider several consequences of this result. One is that the weight...

Full text
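
The weighted-average result can be checked numerically. The sketch below (Python/NumPy, hypothetical counts) collapses a four-category table at each of its n−1 cut points, computes Cohen's kappa for every collapsed 2×2 table, and averages those kappas with weights 1 − E_c, the denominator of each 2×2 kappa; the result coincides with the linearly weighted kappa of the full table.

```python
import numpy as np

def linear_kappa(table):
    """Cohen's weighted kappa with linear weights."""
    p = table / table.sum()
    n = p.shape[0]
    i, j = np.indices((n, n))
    v = np.abs(i - j)
    e = np.outer(p.sum(axis=1), p.sum(axis=0))
    return 1.0 - (v * p).sum() / (v * e).sum()

def collapsed_kappas(table):
    """(kappa_c, 1 - E_c) for each of the n-1 collapsed 2x2 tables."""
    p = table / table.sum()
    n = p.shape[0]
    out = []
    for c in range(1, n):
        # combine categories 0..c-1 versus c..n-1
        a = p[:c, :c].sum(); b = p[:c, c:].sum()
        d = p[c:, :c].sum(); g = p[c:, c:].sum()
        po = a + g                                  # observed agreement O_c
        pe = (a + b) * (a + d) + (d + g) * (b + g)  # expected agreement E_c
        out.append(((po - pe) / (1 - pe), 1 - pe))
    return out

# Hypothetical 4x4 agreement table.
T = np.array([
    [11,  3,  1,  0],
    [ 4, 12,  5,  1],
    [ 1,  4, 10,  6],
    [ 0,  2,  4,  9],
], dtype=float)

pairs = collapsed_kappas(T)
avg = sum(k * w for k, w in pairs) / sum(w for _, w in pairs)
print(f"linearly weighted kappa:        {linear_kappa(T):.6f}")
print(f"weighted average of 2x2 kappas: {avg:.6f}")  # matches the line above
```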

Kappa Test for Agreement Between Two Raters

This module computes power and sample size for the test of agreement between two raters using the kappa statistic. The power calculations are based on the results in Flack, Afifi, Lachenbruch, and Schouten (1988). Calculations are based on ratings for k categories from two raters or judges. You are able to vary category frequencies on a single run of the procedure to analyze a wide...

Full text
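
The Flack, Afifi, Lachenbruch, and Schouten (1988) formulas themselves are not reproduced here. As a rough stand-in, the sketch below estimates power for a one-sided test of H0: kappa = 0.40 against kappa = 0.60 by plain Monte Carlo simulation; the marginal category frequencies, kappa values, sample size, and alpha are all made-up inputs, and the kappa-mixture model used to generate tables is an assumption of the sketch, not the module's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def joint_from_kappa(marg, kappa):
    """Joint cell probabilities for two raters with identical marginals,
    built as a kappa-mixture of perfect agreement and chance agreement
    (this construction yields a population kappa exactly equal to kappa)."""
    return kappa * np.diag(marg) + (1 - kappa) * np.outer(marg, marg)

def cohen_kappa(table):
    p = table / table.sum()
    po = np.trace(p)
    pe = (p.sum(axis=1) * p.sum(axis=0)).sum()
    return (po - pe) / (1 - pe)

def simulate_kappas(marg, kappa, n, reps):
    """Sampling distribution of kappa for tables of n rated subjects."""
    probs = joint_from_kappa(marg, kappa).ravel()
    k = len(marg)
    return np.array([
        cohen_kappa(rng.multinomial(n, probs).reshape(k, k).astype(float))
        for _ in range(reps)
    ])

marg = np.array([0.2, 0.3, 0.3, 0.2])   # assumed category frequencies
n, reps, alpha = 100, 2000, 0.05
null = simulate_kappas(marg, kappa=0.4, n=n, reps=reps)  # under H0
alt  = simulate_kappas(marg, kappa=0.6, n=n, reps=reps)  # under H1
crit = np.quantile(null, 1 - alpha)     # one-sided critical value
print(f"estimated power at n={n}: {(alt > crit).mean():.2f}")
```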

Understanding interobserver agreement: the kappa statistic.

Items such as physical exam findings, radiographic interpretations, or other diagnostic tests often rely on some degree of subjective interpretation by observers. Studies that measure the agreement between two or more observers should include a statistic that takes into account the fact that observers will sometimes agree or disagree simply by chance. The kappa statistic (or kappa coefficient) ...

Full text
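
The chance correction the abstract describes is easy to see on a small example. Below is a minimal Python sketch (NumPy, made-up counts): two observers agree on 80% of cases, but half of that agreement is expected by chance alone, so kappa credits them with 0.60 rather than 0.80.

```python
import numpy as np

# Hypothetical 2x2 table: observer A's rating in rows, observer B's
# in columns ("finding present" / "finding absent").
counts = np.array([
    [40, 10],
    [10, 40],
], dtype=float)

p = counts / counts.sum()
p_observed = np.trace(p)                          # raw agreement: 0.80
p_chance = (p.sum(axis=1) * p.sum(axis=0)).sum()  # expected by chance: 0.50
kappa = (p_observed - p_chance) / (1 - p_chance)  # 0.60
print(f"observed agreement: {p_observed:.2f}")
print(f"chance agreement:   {p_chance:.2f}")
print(f"kappa:              {kappa:.2f}")
```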


Journal

Journal title: Statistical Methodology

Year: 2011

ISSN: 1572-3127

DOI: 10.1016/j.stamet.2010.09.004